Improved Lower Bounds on Mutual Information Accounting for Nonlinear Signal–Noise Interaction

Authors

Abstract


Similar articles

Lower bounds on mutual information

We correct claims about lower bounds on mutual information (MI) between real-valued random variables made by Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions. This is so in spite of the invariance of MI under reparametrizations, because linear correlations are not in...
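As a point of reference for why the marginals matter (a standard maximum-entropy argument, not quoted from the paper above): if both X and Y have Gaussian marginals, the jointly Gaussian distribution maximizes h(X,Y) for a fixed covariance matrix, so

    I(X;Y) = h(X) + h(Y) - h(X,Y) \geq -\tfrac{1}{2}\log(1 - \rho^2),

where \rho is the Pearson correlation coefficient. For non-Gaussian marginals, the same expression in \rho alone need not be a lower bound on MI.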


Tighter Lower Bounds on Mutual Information for Fiber-Optic Channels

In fiber-optic communications, evaluation of mutual information (MI) is still an open issue due to the unavailability of an exact and mathematically tractable channel model. Traditionally, lower bounds on MI are computed by approximating the (original) channel with an auxiliary forward channel. In this paper, lower bounds are computed using an auxiliary backward channel, which has not been prev...
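The forward auxiliary-channel approach referred to here is commonly based on the mismatched-decoding inequality (sketched here for context; the notation is ours, not the paper's): for any auxiliary channel law q(y|x),

    I(X;Y) \geq E_{p(x,y)}\!\left[ \log \frac{q(Y|X)}{q(Y)} \right], \qquad q(y) = \sum_{x} p(x)\, q(y|x),

where the expectation is taken over the true channel, so the right-hand side can be estimated by Monte Carlo simulation even when p(y|x) has no tractable closed form.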


Tight RMR Lower Bounds for Mutual Exclusion

We investigate the remote memory references (RMRs) complexity of deterministic processes that communicate by reading and writing shared memory in asynchronous cache-coherent and distributed shared-memory multiprocessors. We define a class of algorithms that we call order encoding. By applying information-theoretic arguments, we prove that every order encoding algorithm, shared by n processes, h...


Improved Lower Bounds for Shellsort

We give improved lower bounds for Shellsort based on a new and relatively simple proof idea. The lower bounds obtained are both stronger and more general than the previously known bounds. In particular, they hold for nonmonotone increment sequences and adaptive Shellsort algorithms, as well as for some recently proposed variations of Shellsort.


Bounds on mutual information for simple codes using information combining

For coded transmission over a memoryless channel, two kinds of mutual information are considered: the mutual information between a code symbol and its noisy observation and the overall mutual information between encoder input and decoder output. The overall mutual information is interpreted as a combination of the mutual informations associated with the individual code symbols. Thus, exploiting...
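A concrete special case of information combining (a textbook illustration, not taken from the abstract above): for two independent uniform bits X_1, X_2 observed through independent binary erasure channels with symbol-wise mutual informations I_1 and I_2, the information about the modulo-2 sum is

    I(X_1 \oplus X_2; Y_1, Y_2) = I_1 I_2,

since the sum is known only when neither observation is erased, whereas two erasure-channel observations of the same bit combine to 1 - (1 - I_1)(1 - I_2). For general binary-input symmetric channels with the same I_1 and I_2, the combined value lies between the erasure-channel and binary-symmetric-channel extremes, which is the property such bounding techniques exploit.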



Journal

Journal title: Journal of Lightwave Technology

Year: 2018

ISSN: 0733-8724, 1558-2213

DOI: 10.1109/jlt.2018.2869109